
Fix CryptoJS argument passing in DeriveEVPKey #1767

Merged: 1 commit merged into gchq:master on Apr 2, 2024

Conversation

@zb3 (Contributor) commented on Apr 1, 2024:

As pointed out in #1593, CryptoJS treats strings as Utf8, so for binary strings, Latin1 needs to be used.
This PR should fix that issue, and there are no other instances of raw strings being passed to CryptoJS methods.
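To illustrate the difference (a minimal sketch, not part of the PR, assuming crypto-js 4.x; WordArray.toString() defaults to hex):

import CryptoJS from "crypto-js";

// A "binary string": each char code is a single byte (0x00-0xFF).
const binary = "\xde\xad\xbe\xef";

// Utf8.parse re-encodes char codes >= 0x80 as multi-byte sequences,
// so the resulting key material no longer matches the original bytes.
CryptoJS.enc.Utf8.parse(binary).toString();   // "c39ec2adc2bec3af" (8 bytes, wrong)
CryptoJS.enc.Latin1.parse(binary).toString(); // "deadbeef" (4 bytes, as intended)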

CryptoJS treats strings as Utf8, so for binary strings, Latin1 needs to be used.
@@ -62,11 +62,13 @@ class DeriveEVPKey extends Operation {
      * @returns {string}
      */
     run(input, args) {
-        const passphrase = Utils.convertToByteString(args[0].string, args[0].option),
+        const passphrase = CryptoJS.enc.Latin1.parse(
+            Utils.convertToByteString(args[0].string, args[0].option)),
Member commented:
This feels a little contrived at the moment. Using Latin1 as the encoding scheme for the string does solve the issue, but doesn't feel optimal. Could we instead replace this with a conversion to a byte array and then create a CryptoJS word array from that? E.g.:

// Note: Entirely untested.
CryptoJS.lib.WordArray.create(new Uint8Array(Utils.convertToByteArray(args[0].string, args[0].option)))

Member commented:

That being said, this is a clear improvement over the current broken implementation, so if this isn't a two-minute replacement I'm happy to approve and merge this pull request.

@zb3 (Author) commented:

Doesn't WordArray expect an array of 32-bit integers? I'm not sure whether new Uint32Array(new Uint8Array(byteArray).buffer) would have the correct endianness.
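To illustrate the concern (a quick sketch, assuming a little-endian host, which is the common case):

const bytes = new Uint8Array([0xde, 0xad, 0xbe, 0xef]);
// A Uint32Array view uses the platform's byte order, so on a little-endian
// machine the first word comes out byte-reversed.
new Uint32Array(bytes.buffer)[0].toString(16); // "efbeadde"
// A CryptoJS WordArray would need the first word to be 0xdeadbeef for these bytes.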

@zb3 (Author) commented:

Looks like WordArray expects a big-endian view of the buffer. The Latin1 parse code, at least to me, looks about as direct as possible, given that we need an array of 32-bit integers representing a big-endian view of the string.
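For reference, the Latin1 parser packs bytes into big-endian 32-bit words roughly like this (a paraphrase of the CryptoJS source, untested here):

// Roughly what CryptoJS.enc.Latin1.parse does (paraphrased, illustrative only).
function latin1ToWordArray(str) {
    const words = [];
    for (let i = 0; i < str.length; i++) {
        // Byte i lands in word i >>> 2, highest-order byte first => big-endian.
        words[i >>> 2] |= (str.charCodeAt(i) & 0xff) << (24 - (i % 4) * 8);
    }
    return CryptoJS.lib.WordArray.create(words, str.length);
}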

Member commented:

That's fair, I'll approve this then! 👍

@a3957273 merged commit 9448106 into gchq:master on Apr 2, 2024
4 checks passed